bit-slice microprocessor - translation into Russian


COMPUTING TECHNIQUE
Bit slice; Bit slice processor; Bit-slice; Bitslice microprocessor technology; Bit-slice processor; Bit slice processors; ALU slice; Slice ALU; ALU Slice; Bit-slice ALU; Bit-slice CPU; Bit slice ALU; Bit slice CPU; 2-bit architecture; Bit-slicing; Slice cascadable processor; Slice processor; Cascadable ALU; Bit slice microprocessor; Bit-slice microprocessor; Bitslice; Bitslicing; 2-bit computing

bit-slice microprocessor         
разрядно-секционированный микропроцессор (literally "bit-sectioned microprocessor")
bit slice         

general vocabulary

секционный [процессор] (sectioned [processor])

a method, widely used in the 1980s, of constructing a processor from so-called processor sections (slices): independent processing elements, each handling 1, 2, 4, or 8 bits of data

See also

ALU; CPU

bit length         
NUMBER OF BINARY DIGITS (BITS) NECESSARY TO REPRESENT AN INTEGER IN THE BINARY NUMBER SYSTEM
Bit length; Bit width
длина (сообщения) в битах (length of a message in bits)
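
As a small illustration of this definition, here is a minimal C sketch of computing the bit length of an unsigned integer. The function name bit_length and the convention that bit_length(0) == 0 are choices made for this example, not a standard library API.

#include <stdio.h>

/* Bit length of an unsigned integer: the number of binary digits
   needed to represent it; by the usual convention, bit_length(0) == 0. */
unsigned bit_length(unsigned long long n)
{
    unsigned len = 0;
    while (n) {        /* shift right until no set bits remain */
        n >>= 1;
        len++;
    }
    return len;
}

int main(void)
{
    printf("%u\n", bit_length(255));   /* 8, since 255 = 11111111 in binary */
    printf("%u\n", bit_length(256));   /* 9, since 256 = 100000000 in binary */
    return 0;
}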

Definition

bit slice
<architecture> A technique for constructing a processor from modules, each of which processes one bit-field or "slice" of an operand. Bit slice processors usually consist of an ALU of 1, 2, 4 or 8 bits and control lines (including carry or overflow signals usually internal to the CPU). For example, two 4-bit ALUs could be arranged side by side, with control lines between them, to form an 8-bit ALU. A sequencer executes a program to provide data and control signals. The AMD Am2901 is an example. (1994-11-15)
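
To make the cascading described above concrete, the following is a small C simulation of two 4-bit adder slices chained through a carry line to form an 8-bit adder. It models the principle only, not any vendor's actual hardware interface; names such as alu_slice_add are invented for this sketch.

#include <stdint.h>
#include <stdio.h>

/* One 4-bit adder "slice": adds two 4-bit operands plus a carry-in,
   producing a 4-bit result and a carry-out on the inter-slice line. */
static uint8_t alu_slice_add(uint8_t a, uint8_t b, unsigned carry_in,
                             unsigned *carry_out)
{
    unsigned sum = (a & 0xF) + (b & 0xF) + (carry_in & 1);
    *carry_out = (sum >> 4) & 1;   /* bit 4 is the carry to the next slice */
    return (uint8_t)(sum & 0xF);
}

/* Two slices arranged side by side: the low slice's carry-out feeds
   the high slice's carry-in, yielding an 8-bit addition. */
static uint8_t add8_from_slices(uint8_t a, uint8_t b, unsigned *carry_out)
{
    unsigned c;
    uint8_t lo = alu_slice_add(a & 0xF, b & 0xF, 0, &c);
    uint8_t hi = alu_slice_add(a >> 4, b >> 4, c, carry_out);
    return (uint8_t)((hi << 4) | lo);
}

int main(void)
{
    unsigned c;
    uint8_t r = add8_from_slices(0x9C, 0x37, &c);
    printf("0x9C + 0x37 = 0x%02X, carry %u\n", r, c);  /* 0xD3, carry 0 */
    return 0;
}

The carry variable passed between the two calls plays the role of the inter-slice control line mentioned in the definition; 4-bit slices such as the Am2901 are cascaded through their carry signals in the same way.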

Wikipedia

Bit slicing

Bit slicing is a technique for constructing a processor from modules of smaller bit width in order to increase the word length; in principle it can produce an arbitrary n-bit central processing unit (CPU). Each component module processes one bit field or "slice" of an operand; grouped together, the modules process the full word length chosen for a given design.

Bit slicing largely died out with the advent of the single-chip microprocessor. More recently it has been used in arithmetic logic units (ALUs) for quantum computers and as a software technique, e.g. for cryptography on x86 CPUs.
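
The software technique mentioned above can be sketched in C: each machine word holds one bit from each of 64 independent instances, so a single sequence of bitwise operations evaluates a boolean circuit for all 64 instances at once. The function full_adder_sliced and the lane layout are illustrative assumptions, not drawn from any particular cryptographic library.

#include <stdint.h>
#include <stdio.h>

/* Software bit slicing: each uint64_t holds the same 1-bit variable
   for 64 independent instances ("lanes"), so the bitwise expressions
   below run a 1-bit full adder across all 64 lanes in parallel. */
static void full_adder_sliced(uint64_t a, uint64_t b, uint64_t cin,
                              uint64_t *sum, uint64_t *cout)
{
    *sum  = a ^ b ^ cin;                 /* sum bit, per lane */
    *cout = (a & b) | (cin & (a ^ b));   /* carry-out bit, per lane */
}

int main(void)
{
    uint64_t sum, cout;
    /* lane 0: a=1, b=1, cin=0 -> sum=0, cout=1
       lane 1: a=0, b=1, cin=1 -> sum=0, cout=1 */
    full_adder_sliced(0x1, 0x3, 0x2, &sum, &cout);
    printf("sum=%llx cout=%llx\n",
           (unsigned long long)sum, (unsigned long long)cout);  /* sum=0 cout=3 */
    return 0;
}

Cryptographic implementations use the same idea at larger scale: a cipher expressed as a boolean circuit is evaluated with bitwise instructions, processing as many independent blocks in parallel as the register is wide.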